Markov Chain

A stochastic process in which the next state depends only on the current state, not on the rest of the history (the Markov property).

Definition

$$ \begin{aligned} P[X_{n+1} = x_{n+1} | X_{n} = x_{n}, \ldots, X_{1} = x_1] &= P[X_{n+1} = x_{n+1} | X_{n} = x_{n}] \\ &\forall (x_1, \ldots, x_{n+1}) \in \mathcal{X}^{n+1} \end{aligned} $$
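
A quick numerical sanity check of the definition (a minimal NumPy sketch; the three-state chain and its probabilities are illustrative assumptions, and the transition matrix $P$ is defined formally under Matrix Representation below): conditioning on the full history $(X_1, X_2)$ gives the same next-state distribution as conditioning on $X_2$ alone.

```python
import numpy as np

# Illustrative 3-state chain: mu0 is the distribution of X_1,
# P[i, j] = P[X_{n+1} = j | X_n = i] (strictly positive entries,
# so we never condition on a zero-probability history).
mu0 = np.array([0.5, 0.3, 0.2])
P = np.array([[0.70, 0.20, 0.10],
              [0.30, 0.50, 0.20],
              [0.25, 0.25, 0.50]])

# Joint distribution of (X_1, X_2, X_3): joint[c, a, b] = mu0[c] P[c, a] P[a, b].
joint = mu0[:, None, None] * P[:, :, None] * P[None, :, :]

# P[X_3 = b | X_2 = a, X_1 = c] ...
cond_full = joint / joint.sum(axis=2, keepdims=True)
# ... versus P[X_3 = b | X_2 = a].
marg_23 = joint.sum(axis=0)
cond_last = marg_23 / marg_23.sum(axis=1, keepdims=True)

print(np.allclose(cond_full, cond_last[None, :, :]))  # True: the history is irrelevant
print(np.allclose(cond_last, P))                      # True: both equal rows of P
```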

Property

Time-Invariance

$$ \begin{aligned} P[X_{n + 1} = b | X_{n} = a] &= P[X_{2} = b | X_{1} = a] \\ \forall a, b &\in \mathcal{X}, \forall n \end{aligned} $$

Matrix Representation

For a time-invariant Markov Chain on a finite state space $\mathcal{X}$, collect the one-step probabilities into the transition matrix $P$: $$ \begin{aligned} P_{ij} &= P[X_{n+1} = j | X_{n} = i], &\forall i, j \in \mathcal{X} \end{aligned} $$ Each row of $P$ sums to $1$, and a distribution over $\mathcal{X}$ is treated as a row vector.

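As a concrete illustration (a minimal NumPy sketch; the 3-state matrix is an assumed example, not from the note), the chain can be stored as a row-stochastic matrix and sampled step by step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative transition matrix: P[i, j] = P[X_{n+1} = j | X_n = i],
# each row sums to 1 (row-stochastic).
P = np.array([[0.70, 0.20, 0.10],
              [0.30, 0.50, 0.20],
              [0.25, 0.25, 0.50]])
assert np.allclose(P.sum(axis=1), 1.0)

def simulate(P, x0, n_steps, rng):
    """Sample a path X_1, X_2, ..., X_{n_steps+1} starting from state x0."""
    path = [x0]
    for _ in range(n_steps):
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(simulate(P, x0=0, n_steps=10, rng=rng))
```
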
Stationary Distribution

A distribution $\mu$ is stationary if: $$ \begin{aligned} \mu &= \mu \cdot P \end{aligned} $$
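
A minimal sketch of computing $\mu$ numerically (same illustrative matrix as above): since $\mu = \mu \cdot P$, $\mu$ is a left eigenvector of $P$ with eigenvalue $1$, normalized to sum to $1$.

```python
import numpy as np

P = np.array([[0.70, 0.20, 0.10],
              [0.30, 0.50, 0.20],
              [0.25, 0.25, 0.50]])

# Left eigenvectors of P are (right) eigenvectors of P.T.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))   # the eigenvalue closest to 1
mu = np.real(eigvecs[:, k])
mu = mu / mu.sum()                     # normalize into a distribution

print(mu)
print(np.allclose(mu @ P, mu))         # True: mu = mu * P
```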

Specification

Everything needed to specify a time-invariant Markov Chain: an initial distribution $\mu_0$ over $\mathcal{X}$ and the transition matrix $P$.

Theorems

For an irreducible, aperiodic (time-invariant) Markov Chain, the distribution of $X_n$ converges to the unique stationary distribution $\mu$ from any initial distribution: $$ \begin{aligned} \mu &= \lim_{n \rightarrow \infty} \mu_0 \cdot P^{n}, &\forall \mu_0 \end{aligned} $$ For a stationary Markov Chain, the entropy rate reduces to a single conditional entropy: $$ \begin{aligned} H(\mathcal{X}) &= \lim_{n \rightarrow \infty} \frac{1}{n} H(X_1, \ldots, X_n) = H(X_2 | X_1) \end{aligned} $$
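
A minimal numerical illustration of both statements (same illustrative matrix as above, which is irreducible and aperiodic, so the convergence theorem applies): iterate $\mu_0 \cdot P^n$ and compare with the stationary $\mu$, then compute $H(X_2 | X_1) = -\sum_i \mu_i \sum_j P_{ij} \log_2 P_{ij}$.

```python
import numpy as np

P = np.array([[0.70, 0.20, 0.10],
              [0.30, 0.50, 0.20],
              [0.25, 0.25, 0.50]])

# Convergence: mu0 P^n approaches the stationary distribution for any mu0.
mu0 = np.array([1.0, 0.0, 0.0])   # an arbitrary initial distribution
mu = mu0.copy()
for _ in range(1000):
    mu = mu @ P
print(mu)                          # ~ stationary distribution

# Entropy rate of the stationary chain: H(X_2 | X_1), in bits per step.
row_entropies = -np.sum(P * np.log2(P), axis=1)   # entropy of each row of P
print(mu @ row_entropies)
```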

by Jon